From Prompts to Pipelines: An Overview of LangChain Orchestration
AI010 · Lesson 6

From Prompts to Pipelines

The Evolution of LLM Interaction

In earlier lessons we focused on single-prompt interactions. Real-world applications, however, go far beyond one round of question and answer. To build scalable AI systems, we have to move to orchestration: chaining multiple LLM calls together, branching logically on user input, and letting the model interact with external data.

The Basic Components of Orchestration

  • LLMChain: the basic building block. It pairs a prompt template with a language model.
  • Sequential chains: they let you build a multi-step workflow in which each step's output becomes the next step's input.
  • Router chains: they act as a "traffic controller," using an LLM to decide which specialized sub-chain should handle a given request (for example, sending math questions to a "math chain" and history questions to a "history chain").
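The routing idea above can be sketched without any framework: a classifier picks which handler receives the request. Below is a minimal pure-Python sketch, not the LangChain API; the keyword classifier stands in for the LLM-based router, and all names here are illustrative.

```python
def classify(question: str) -> str:
    """Crude keyword classifier standing in for the routing LLM call."""
    q = question.lower()
    if any(word in q for word in ("solve", "equation", "integral", "+")):
        return "math"
    return "history"

def route(question: str) -> str:
    """Dispatch to the sub-chain chosen by the classifier."""
    # Each sub-chain is just a function here; in LangChain each would be
    # an LLMChain with its own prompt template.
    chains = {
        "math": lambda q: f"[math chain] {q}",
        "history": lambda q: f"[history chain] {q}",
    }
    return chains[classify(question)](question)

print(route("Solve the equation x + 2 = 5"))  # dispatched to the math chain
print(route("Who was Charlemagne?"))          # dispatched to the history chain
```

In a real router chain the classification step is itself an LLM call, which is what lets the routing handle open-ended phrasing instead of fixed keywords.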

Core Principle: Chaining

Chains combine multiple components (models, prompts, and memory) into a single, coherent application. This modular design means complex tasks can be broken down into manageable, debuggable steps.
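At its core, a sequential chain is function composition: each step's output is fed to the next step. A minimal sketch of that data flow (plain Python, not the LangChain API; the step functions are invented for illustration):

```python
def run_pipeline(steps, text):
    """Feed the output of each step into the next, like SimpleSequentialChain."""
    for step in steps:
        text = step(text)
    return text

def summarize(t):
    # Stands in for an LLMChain with a "summarize this" prompt template.
    return f"summary({t})"

def translate(t):
    # Stands in for an LLMChain with a "translate this" prompt template.
    return f"translation({t})"

result = run_pipeline([summarize, translate], "raw document")
print(result)  # translation(summary(raw document))
```

Because each step is an independent unit, you can test or swap any stage in isolation, which is exactly the debuggability the principle above describes.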

Pro Tip: Debugging Pipelines
When your pipeline grows complex, set langchain.debug = True. This "X-ray vision" lets you see, at every stage of the chain, the exact prompt that was sent and the raw output that came back.
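In the same spirit, a small wrapper that logs each step's input and output illustrates the kind of per-stage visibility debug mode surfaces (a plain-Python sketch, not LangChain's actual implementation):

```python
def traced(name, fn):
    """Wrap a step so its input and output are printed before and after
    it runs, mimicking the per-stage trace that debug mode provides."""
    def wrapper(text):
        print(f"[{name}] input : {text!r}")
        out = fn(text)
        print(f"[{name}] output: {out!r}")
        return out
    return wrapper

step = traced("outline", lambda t: f"outline of {t}")
step("quarterly report")
```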
Question 1
In LangChain, what is the primary difference between a SimpleSequentialChain and a standard SequentialChain?
  • SimpleSequentialChain supports multiple input variables, while SequentialChain does not.
  • SimpleSequentialChain only supports a single input and single output flowing between steps.
  • Only SequentialChain can be used with ChatOpenAI models.
Challenge: Library Support Router
Design a routing mechanism for a specialized bot.
You are building a support bot for a library.

Define the logic for a RouterChain that distinguishes between "Book Recommendations" and "Operating Hours."
Step 1
Create two prompt templates: one for book suggestions and one for library schedule info.
Solution:
book_template = """You are a librarian. Recommend books based on: {input}"""
schedule_template = """You are a receptionist. Answer hours queries: {input}"""

prompt_infos = [
    {"name": "books", "description": "Good for recommending books", "prompt_template": book_template},
    {"name": "schedule", "description": "Good for answering operating hours", "prompt_template": schedule_template}
]
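The Step 2 solution below references destinations_str and destination_chains, which are derived from the prompt_infos list above. A sketch of that glue follows; the destination listing is plain string building, while the LLMChain construction (which needs a live llm) is shown as comments:

```python
# Assumed: the prompt_infos list defined in Step 1 (descriptions restated here).
prompt_infos = [
    {"name": "books", "description": "Good for recommending books"},
    {"name": "schedule", "description": "Good for answering operating hours"},
]

# The router prompt needs a newline-separated "name: description" listing
# so the routing LLM can see which destinations are available.
destinations_str = "\n".join(
    f"{p['name']}: {p['description']}" for p in prompt_infos
)
print(destinations_str)

# With LangChain installed, each entry then becomes a destination chain:
#   destination_chains[p["name"]] = LLMChain(
#       llm=llm,
#       prompt=PromptTemplate(template=p["prompt_template"],
#                             input_variables=["input"]),
#   )
# default_chain is a plain LLMChain that handles anything the router
# cannot classify.
```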
Step 2
Define the router_template to guide the LLM on how to classify the user's intent, and initialize the chain.
Solution:
from langchain.chains.router import LLMRouterChain, MultiPromptChain
from langchain.chains.router.llm_router import RouterOutputParser
from langchain.chains.router.multi_prompt_prompt import MULTI_PROMPT_ROUTER_TEMPLATE
from langchain.prompts import PromptTemplate

# Build the routing prompt from the "name: description" listing of
# destinations derived from prompt_infos.
router_template = MULTI_PROMPT_ROUTER_TEMPLATE.format(
    destinations=destinations_str
)
router_prompt = PromptTemplate(
    template=router_template,
    input_variables=["input"],
    output_parser=RouterOutputParser(),
)
router_chain = LLMRouterChain.from_llm(llm, router_prompt)

chain = MultiPromptChain(
    router_chain=router_chain,
    destination_chains=destination_chains,
    default_chain=default_chain,
    verbose=True
)